Entropy compression

In mathematics and theoretical computer science, entropy compression is an information-theoretic method for proving that a random process terminates, originally used by Robin Moser to prove an algorithmic version of the Lovász local lemma.

==Description==
To use this method, one proves that the history of the given process can be recorded in an efficient way, such that the state of the process at any past time can be recovered from the current state and this record, and such that the amount of additional information that is recorded at each step of the process is (on average) less than the amount of new information randomly generated at each step. The resulting growing discrepancy in total information content can never exceed the fixed amount of information in the current state, from which it follows that the process must eventually terminate. This principle can be formalized and made rigorous using Kolmogorov complexity.
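A minimal counting sketch illustrates why the information discrepancy forces termination; the symbols below (a process that consumes b fresh random bits per step, writes at most r < b bits of record per step, and whose current state fits in s bits) are illustrative assumptions rather than notation taken from Moser's papers. If the process is still running after t steps, then by assumption the current state and the record together allow the entire history, and hence all tb random bits consumed so far, to be reconstructed; they therefore constitute a lossless description of those bits using at most tr + s bits. Since a uniformly random string of tb bits cannot (except with small probability) be described in fewer than tb bits,

\[ t\,b \;\le\; t\,r + s , \]

and because r < b this rearranges to

\[ t \;\le\; \frac{s}{b - r} , \]

so with high probability the process terminates within s/(b − r) steps.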